Sequence stacking using dual encoder Seq2Seq recurrent networks

Authors

  • Alessandro Bay
  • Biswa Sengupta
Abstract

A widely studied non-deterministic polynomial time (NP) hard problem lies in finding a route between two nodes of a graph. Often, meta-heuristic algorithms such as A∗ are employed on graphs with a large number of nodes. Here, we propose a deep recurrent neural network architecture based on the Sequence-2-Sequence (Seq2Seq) model, widely used, for instance, in text translation. In particular, we illustrate that utilising a context vector learned from two different recurrent networks enables increased accuracy in learning the shortest route of a graph. Additionally, we show that one can boost the performance of the Seq2Seq network by smoothing the loss function using a homotopy continuation of the decoder's loss function.
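As a rough, non-authoritative sketch of the idea described above (the abstract does not specify the encoder types, how the two context vectors are fused, or the exact form of the homotopy continuation), the following Python/PyTorch snippet assumes two GRU encoders whose final hidden states are concatenated and projected into a single context vector that initialises the decoder, plus a loss that blends a label-smoothed surrogate with the true cross-entropy as a homotopy parameter lam is annealed from 0 to 1. The names DualEncoderSeq2Seq and homotopy_loss are hypothetical and not taken from the paper.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class DualEncoderSeq2Seq(nn.Module):
        def __init__(self, num_nodes, emb_dim=64, hid_dim=128):
            super().__init__()
            self.embed = nn.Embedding(num_nodes, emb_dim)
            # Two different recurrent encoders read the same source sequence.
            self.encoder_a = nn.GRU(emb_dim, hid_dim, batch_first=True)
            self.encoder_b = nn.GRU(emb_dim, hid_dim, batch_first=True)
            # Fuse both final hidden states into a single context vector.
            self.fuse = nn.Linear(2 * hid_dim, hid_dim)
            self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
            self.out = nn.Linear(hid_dim, num_nodes)

        def forward(self, src, tgt):
            # src, tgt: (batch, seq_len) integer node indices
            src_emb = self.embed(src)
            tgt_emb = self.embed(tgt)
            _, h_a = self.encoder_a(src_emb)              # (1, batch, hid_dim)
            _, h_b = self.encoder_b(src_emb)              # (1, batch, hid_dim)
            context = torch.tanh(self.fuse(torch.cat([h_a, h_b], dim=-1)))
            dec_out, _ = self.decoder(tgt_emb, context)   # teacher-forced decoding
            return self.out(dec_out)                      # (batch, seq_len, num_nodes) logits

    def homotopy_loss(logits, targets, lam):
        # Convex blend of a smoothed surrogate and the target cross-entropy, 0 <= lam <= 1.
        logits = logits.reshape(-1, logits.size(-1))
        targets = targets.reshape(-1)
        hard = F.cross_entropy(logits, targets)
        soft = F.cross_entropy(logits, targets, label_smoothing=0.3)  # smoothed surrogate
        return (1.0 - lam) * soft + lam * hard

During training, lam would typically be increased from 0 towards 1 on a fixed schedule, so that optimisation starts on the smoother surrogate and gradually transitions to the original decoder loss; the paper's actual continuation scheme may differ from this blend.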


Similar Articles

StackSeq2Seq: Dual Encoder Seq2Seq Recurrent Networks

A widely studied non-deterministic polynomial time (NP) hard problem lies in finding a route between two nodes of a graph. Often, meta-heuristic algorithms such as A∗ are employed on graphs with a large number of nodes. Here, we propose a deep recurrent neural network architecture based on the Sequence-2-Sequence (Seq2Seq) model, widely used, for instance, in text translation. Particularly, we ...


Recurrent Neural Network-Based Semantic Variational Autoencoder for Sequence-to-Sequence Learning

Sequence-to-sequence (Seq2seq) models have played an important role in the recent success of various natural language processing methods, such as machine translation, text summarization, and speech recognition. However, current Seq2seq models have trouble preserving global latent information from a long sequence of words. Variational autoencoder (VAE) alleviates this problem by learning a continuo...


Enterprise to Computer: Star Trek chatbot

Human interactions and human-computer interactions are strongly influenced by style as well as content. Adding a persona to a chatbot makes it more human-like and contributes to a better and more engaging user experience. In this work, we propose a design for a chatbot that captures the style of Star Trek by incorporating references from the show along with peculiar tones of the fictional chara...


Variational Attention for Sequence-to-Sequence Models

The variational encoder-decoder (VED) encodes source information as a set of random variables using a neural network, which in turn is decoded into target data using another neural network. In natural language processing, sequence-to-sequence (Seq2Seq) models typically serve as encoder-decoder networks. When combined with a traditional (deterministic) attention mechanism, the variational latent ...


DRAGNN: A Transition-based Framework for Dynamically Connected Neural Networks

In this work, we present a compact, modular framework for constructing new recurrent neural architectures. Our basic module is a new generic unit, the Transition Based Recurrent Unit (TBRU). In addition to hidden layer activations, TBRUs have discrete state dynamics that allow network connections to be built dynamically as a function of intermediate activations. By connecting multiple TBRUs, we...



Journal:
  • CoRR

Volume: abs/1710.04211

Pages: -

Publication date: 2017